Gradient descent is an optimization algorithm used to find a minimum of a given objective function. It does this by iteratively adjusting the input parameters in small steps, each taken in the direction of the negative gradient, i.e. the direction in which the objective function decreases most steeply. Gradient descent is a powerful tool used extensively in machine learning, deep learning, and other areas of optimization.
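The iteration described above can be sketched in a few lines of Python. This is a minimal illustration, assuming a simple one-dimensional objective f(x) = x² (whose gradient is 2x); the function name, learning rate, and step count are illustrative choices, not part of the original text.

```python
def gradient_descent(grad, x0, learning_rate=0.1, steps=100):
    """Repeatedly step opposite the gradient to reduce the objective."""
    x = x0
    for _ in range(steps):
        x -= learning_rate * grad(x)  # move against the gradient
    return x

# Minimize f(x) = x**2, whose gradient is 2*x; the minimum is at x = 0.
minimum = gradient_descent(lambda x: 2 * x, x0=5.0)
```

The learning rate controls the step size: too large and the iterates can overshoot or diverge, too small and convergence is slow.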
See also: neural network, self-organization, free will, cognitive science